Nonlinear Manifold Learning
Author

Abstract
The aim of this report is to study two recently proposed nonlinear manifold learning techniques, Isomap [12] and Locally Linear Embedding (LLE) [8], and to suggest directions for further research. First, the Isomap and LLE algorithms are discussed in detail. Some areas that need further work are pointed out, and a few novel applications that could use these two algorithms are discussed.

1. Problem Statement

Manifold learning can be viewed as implicitly inverting a generative model for a given set of observations. Let Y be a d-dimensional domain contained in the Euclidean space R^d, and let f : Y -> R^D be a smooth embedding for some D > d. The goal is to recover Y and f given N points in R^D. Isomap [12] and LLE [8] provide an implicit description of the mapping f: given X = {x_i in R^D | i = 1 ... N}, find Y = {y_i in R^d | i = 1 ... N} such that x_i = f(y_i) for i = 1 ... N.

For example, consider the swiss roll data set, which is used in many of the papers. The swiss roll is generated by the following equations:

    x1 = y1 cos(y1);  x2 = y1 sin(y1);  x3 = y2;  y1 in [3*pi/2, 9*pi/2];  y2 in [0, 15].

Given a set of observations (x1, x2, x3), we have to find y1 and y2. Note that we are implicitly inverting the generative model without an explicit parametrization of the generative function f.

2. Types of Embedding

Without imposing any restrictions on f, the problem is ill-posed. The simplest case is a linear isometry, i.e. f is a linear mapping from R^d to R^D, where D > d. In this case Principal Component Analysis (PCA) recovers the d significant dimensions of the observed data. Classical Multidimensional Scaling (MDS) produces the same result but uses the pairwise distance matrix instead of the actual coordinates. Two other possibilities are considered in [6, 5]: f can be either an isometric embedding or a conformal embedding. An isometric embedding preserves infinitesimal lengths and angles, while a conformal embedding preserves only infinitesimal angles (it does not preserve lengths).
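Before turning to these more general embeddings, the equivalence in the linear-isometry case can be checked numerically: classical MDS applied only to the pairwise distance matrix recovers the same configuration as PCA applied to the coordinates. A minimal NumPy sketch (the data, dimensions, and variable names are illustrative, not taken from the report):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, D = 200, 2, 5
Y = rng.normal(size=(n, d))                    # latent d-dimensional points
A = np.linalg.qr(rng.normal(size=(D, d)))[0]   # orthonormal columns: a linear isometry R^d -> R^D
X = Y @ A.T                                    # observed D-dimensional data

# PCA on the coordinates: project onto the top-d principal directions.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pca_embed = Xc @ Vt[:d].T

# Classical MDS using only the pairwise squared-distance matrix.
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ sq @ J                          # double-centred Gram matrix
w, V = np.linalg.eigh(B)
idx = np.argsort(w)[::-1][:d]                  # top-d eigenpairs
mds_embed = V[:, idx] * np.sqrt(w[idx])

# Both recover the latent geometry up to rotation/reflection, so the
# pairwise distances of the two embeddings coincide.
def pdist(Z):
    return np.sqrt(((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1))

assert np.allclose(pdist(pca_embed), pdist(mds_embed), atol=1e-6)
```

The two embeddings agree only up to an orthogonal transformation, which is why the check compares pairwise distances rather than the coordinates themselves.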
In the case of a conformal embedding, at every point y in Y there is a scalar s(y) > 0 such that any infinitesimal vector at y gets magnified by the factor s(y); in the case of an isometric embedding, s(y) = 1. One way to visualize these two embeddings is to consider a 2D rubber sheet. The rubber sheet can be curved and folded so that it becomes embedded in 3D space; e.g. the rubber sheet could be folded into an S-shaped curve. If we fold the rubber sheet without stretching it, we have an isometric embedding. If we fold the rubber sheet while also stretching it, we have a conformal embedding.

This report was written as a part of the course CMSC828J: Approaches to representing and recognizing objects, offered in Fall 2003 by Dr. David Jacobs.
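The scale factor s(y) can be made concrete with a small numerical check: for a conformal map, the pulled-back metric J^T J at y equals s(y)^2 I, where J is the Jacobian of f at y, and an isometric embedding is the special case s(y) = 1. A sketch using the complex exponential as a hypothetical conformal map (this example map is an assumption for illustration, not from the report):

```python
import numpy as np

# A concrete conformal map: f(y1, y2) = exp(y1) * (cos(y2), sin(y2)),
# i.e. the complex exponential, with scale factor s(y) = exp(y1).
def f(y):
    y1, y2 = y
    return np.exp(y1) * np.array([np.cos(y2), np.sin(y2)])

def jacobian(fun, y, eps=1e-6):
    # Numerical Jacobian via central differences.
    y = np.asarray(y, dtype=float)
    cols = []
    for i in range(len(y)):
        e = np.zeros_like(y)
        e[i] = eps
        cols.append((fun(y + e) - fun(y - e)) / (2 * eps))
    return np.column_stack(cols)

y = np.array([0.3, 1.1])
J = jacobian(f, y)
G = J.T @ J                     # metric pulled back by f
s2 = np.exp(2 * y[0])           # s(y)^2 for this map
# Conformal: G = s(y)^2 * I (angles preserved, lengths scaled by s(y));
# an isometric embedding would require s(y) = 1 everywhere.
assert np.allclose(G, s2 * np.eye(2), atol=1e-4)
```

Since s(y) here varies with y1, the map stretches the sheet by different amounts at different points while still preserving infinitesimal angles, exactly the distinction drawn above.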